Open Access Journals

Recent Articles

ITD-GMJN: Insider Threat Detection in Cloud Computing using Golf Optimized MICE based Jordan Neural Network

By B. GAYATHRI

DOI: https://doi.org/10.5815/ijcnis.2025.06.08, Pub. Date: 8 Dec. 2025

Cloud computing is a high-level network architecture that allows consumers, owners, and authorized users to access and store their data quickly. Internal risks posed by users now have a significant impact on such clouds: an intruder joins the network as a legitimate member, poses as an ordinary user, and, once inside, attempts to attack or steal confidential information while others are exchanging data. Numerous options exist for the external security of a cloud network, but internal or insider threats must also be addressed. In the proposed work, an advanced deep learning model with optimized missing-value imputation is developed to mitigate insider threats in the cloud system. Behavioral log files collected in an organization are split into sequential data and standalone data based on the login process. Because of improper and missing samples, this data is not ready for detection, so it is pre-processed with Multivariate Imputation by Chained Equations (MICE), in which the estimation parameter is optimally chosen using the Golf Optimization Algorithm (GOA). After the missing values are filled, the data moves to feature extraction: the sequential data is passed to a domain extractor and the standalone data to a Long Short-Term Memory Autoencoder (LS-AE). Both feature sets are fused into a single representation, which is then given to a Jordan Neural Network (JNN) for detection. The proposed method achieves 96% accuracy, 92% recall, 91.6% specificity, 8.39% fall-out, and 8% miss rate, showing that the recommended JNN detection model successfully detects insider threats in a cloud system.
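
The MICE step described above can be illustrated with scikit-learn's IterativeImputer, which implements chained-equation imputation. The sketch below assumes the behavioural logs are already loaded as a numeric table; the GOA-based tuning of the estimation parameter is not reproduced, and the estimator choice and column names are illustrative only.

```python
# Minimal MICE-style imputation sketch for behavioural log features.
# Column names and values are hypothetical; the paper's GOA-optimized
# estimation parameter is replaced by a default BayesianRidge estimator.
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.linear_model import BayesianRidge

logs = pd.DataFrame({
    "login_hour":   [9, 10, np.nan, 23, 2],
    "session_mins": [42, np.nan, 15, 180, 200],
    "files_copied": [1, 0, 3, np.nan, 57],
})

imputer = IterativeImputer(estimator=BayesianRidge(), max_iter=10, random_state=0)
filled = pd.DataFrame(imputer.fit_transform(logs), columns=logs.columns)
print(filled.round(2))
```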

Cocoa Land Mapping Based on Geographic Information System in East Kalimantan Using Leaflet JS and GeoJSON

By Reza Andrea, Agus Ganda Permana, Aulia Khoirunnita

DOI: https://doi.org/10.5815/ijeme.2025.06.05, Pub. Date: 8 Dec. 2025

East Kalimantan Province holds significant potential for developing cocoa commodities, a key sector in regional plantation development. However, there is no effective, web-based information system to visualize and manage data on the distribution of cocoa land, which limits access for both the public and policymakers. To address this, a web-based Geographic Information System (GIS) application was developed using Leaflet JS and GeoJSON. The application provides an interactive map of cocoa plantation locations across East Kalimantan Province, particularly in Kutai Kartanegara, East Kutai, West Kutai, Mahakam Ulu, Penajam Paser Utara, Paser, Berau, Samarinda, Balikpapan, and Bontang, along with information about land area and cocoa production. Spatial data and cocoa production data are integrated into the application so that users, including farmers, entrepreneurs, and government agencies, can obtain comprehensive information about cocoa commodities in East Kalimantan Province. By visualizing the distribution of cocoa land, the GIS tool is expected to support informed policymaking aimed at improving productivity and land-use efficiency, and to raise public awareness of cocoa's potential in local communities.
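
As a rough illustration of the data layer such a map serves, the sketch below assembles a one-feature GeoJSON FeatureCollection in Python. The district name is real, but the coordinates, land area, and production figures are placeholders rather than the study's data.

```python
# Build a toy cocoa-land GeoJSON layer that a Leaflet map could load.
import json

feature = {
    "type": "Feature",
    "properties": {
        "district": "Kutai Kartanegara",
        "land_area_ha": 1250,      # illustrative value
        "production_tons": 800,    # illustrative value
    },
    "geometry": {
        "type": "Point",
        "coordinates": [116.98, -0.44],  # [lon, lat], approximate
    },
}

cocoa_layer = {"type": "FeatureCollection", "features": [feature]}
with open("cocoa_land.geojson", "w") as f:
    json.dump(cocoa_layer, f, indent=2)
# On the client side, Leaflet's L.geoJSON() can render the file and bind
# popups showing land area and production per district.
```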

A Novel Resource and Distribution Aware Random Forest for Agricultural Productivity Prediction

By Harendra Singh Negi, Sushil Chandra Dimri

DOI: https://doi.org/10.5815/ijem.2025.06.04, Pub. Date: 8 Dec. 2025

Agriculture remains one of the economic powerhouses of India, yet productivity is often compromised by poor utilization of soil and environmental data. This paper proposes a new framework, Distribution and Resource Aware Random Forest (DRARF), for smart farming applications. The approach combines IoT-ready soil data comprising moisture, temperature, humidity, pH, and NPK, monitored from different sources and used to make crop-specific decisions. DRARF adds two important novel features to the traditional Random Forest: (i) distribution-aware threshold selection, which guarantees statistically meaningful data partitions, and (ii) resource-aware feature selection, which preserves predictive power without the expense of additional IoT sensors. The framework was assessed using soil and environmental data for wheat and rice. Comparative experiments against Logistic Regression, Support Vector Machine, Naïve Bayes, and the classical Random Forest show that DRARF not only achieves better accuracy, precision, recall, and F1-scores but also minimizes sensor redundancy. The results reflect its scalability, efficiency, and reliability as a precision-agriculture decision-support system. Coupling machine learning with IoT-enabled sensing in this way can advance smart farming technologies, helping to increase crop yields, conserve resources, and improve long-term food security.
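
A generic sketch of the distribution-aware idea, not the authors' DRARF code: candidate split thresholds for a soil feature are drawn from quantiles of its observed distribution rather than from every midpoint between sorted values, which keeps partitions statistically meaningful and cheap to evaluate.

```python
# Restrict decision-tree split candidates to distribution quantiles.
import numpy as np

def quantile_split_candidates(feature_values, n_quantiles=5):
    """Return candidate thresholds at evenly spaced quantiles of the feature."""
    qs = np.linspace(0.1, 0.9, n_quantiles)
    return np.unique(np.quantile(feature_values, qs))

# Hypothetical soil-moisture readings from an IoT node.
soil_moisture = np.random.default_rng(0).normal(loc=30, scale=8, size=500)
print(quantile_split_candidates(soil_moisture))
```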

A Fast Output Generating Set Partitioning in Hierarchical Trees Coding for Medical Image Compression

By Narayana Prakash S. Airani, Mohammad Khan

DOI: https://doi.org/10.5815/ijigsp.2025.06.10, Pub. Date: 8 Dec. 2025

In this paper, we present a Discrete Wavelet Transform (DWT) based Fast Output Generating Set Partitioning in Hierarchical Trees (FOGSPIHT) algorithm for MRI brain image compression. FOGSPIHT is scalable, fast, and robust. Image compression is an important technique that enables fast, high-throughput imaging applications by reducing storage space or transmission bandwidth. The DWT transforms the image into a set of coefficients that allow efficient compression, and the Set Partitioning In Hierarchical Trees (SPIHT) algorithm is an efficient coder for DWT-based image compression. The limitations of SPIHT coding are its complexity and memory requirements. To reduce the complexity, we propose the FOGSPIHT algorithm, which follows the basic principles of SPIHT but operates on coefficients converted to bit planes: the comparison operations in the SPIHT compression process are replaced by simple logical operations on bits. Mean Square Error (MSE), Peak Signal to Noise Ratio (PSNR), and Structural Similarity Index Measure (SSIM) are calculated and plotted against the Compression Ratio (CR). The results obtained with FOGSPIHT are equal to or better than those of SPIHT, and FOGSPIHT is faster, with reduced encoding and decoding times. An FPGA implementation of the FOGSPIHT algorithm on the DWT coefficients of an 8x8 image requires fewer resources and less power than the SPIHT algorithm.
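
The bit-plane view that lets FOGSPIHT replace SPIHT's magnitude comparisons with logical operations can be sketched as follows; the coefficient values are illustrative and the full coding passes are not reproduced.

```python
# Bit-plane decomposition of quantized DWT coefficient magnitudes:
# the SPIHT significance test |c| >= 2**n becomes a shift-and-test on bits.
import numpy as np

coeffs = np.array([[13, 2], [7, 40]], dtype=np.uint16)  # illustrative |coefficients|
n_planes = 6

bit_planes = [(coeffs >> p) & 1 for p in range(n_planes)]  # LSB..MSB planes

def significant_at(n):
    """True where |coefficient| >= 2**n, i.e. a bit at plane n or above is set."""
    return (coeffs >> n) != 0

print(bit_planes[3])      # bit 3 of every coefficient
print(significant_at(3))  # [[True, False], [False, True]]
```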

Innovative Privacy Preserving Strategies in Federated Learning

By Deny P. Francis, R. Sharmila

DOI: https://doi.org/10.5815/ijieeb.2025.06.09, Pub. Date: 8 Dec. 2025

Federated Learning (FL) enables collaborative model training across distributed clients without sharing raw data, but it remains vulnerable to privacy risks. This study introduces FL-ODP-DFT, a novel framework that integrates Optimal Differential Privacy (ODP) with Discrete Fourier Transform (DFT) to enhance both model performance and privacy. By transforming local gradients into the frequency domain, the method reduces data size and adds a layer of encryption before transmission. Adaptive Gaussian Clipping (AGC) is employed to dynamically adjust clipping thresholds based on gradient distribution, further improving gradient handling. ODP then calibrates noise addition based on data sensitivity and privacy budgets, ensuring a balance between privacy and accuracy. Extensive experiments demonstrate that FL-ODP-DFT outperforms existing techniques in terms of accuracy, computational efficiency, convergence speed, and privacy protection, making it a robust and scalable solution for privacy-preserving FL.
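
A minimal sketch of the per-client gradient path described above, assuming a plain NumPy gradient vector: clip, transform with an FFT, truncate to shrink the payload, and add Gaussian noise before transmission. The fixed clipping bound, retention ratio, and noise scale below stand in for the paper's adaptive AGC and ODP calibration.

```python
# Clip -> DFT -> truncate -> add Gaussian noise, then invert on the server side.
import numpy as np

rng = np.random.default_rng(0)
grad = rng.normal(size=256)                       # local gradient (illustrative)

clip = 1.0                                        # AGC would adapt this bound
grad = grad * min(1.0, clip / np.linalg.norm(grad))

spectrum = np.fft.rfft(grad)
k = len(spectrum) // 4                            # keep leading 25% of coefficients
compressed = spectrum[:k]

sigma = 0.05                                      # ODP would calibrate this scale
noisy = compressed + rng.normal(scale=sigma, size=k) \
                   + 1j * rng.normal(scale=sigma, size=k)

# Server: zero-pad and invert to recover an approximate, privatized gradient.
restored = np.fft.irfft(np.pad(noisy, (0, len(spectrum) - k)), n=len(grad))
```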

Quantum-inspired Methods for Training Machine Learning Models

By Nilesh T. Fonseka, Anuradha Mahasinghe

DOI: https://doi.org/10.5815/ijitcs.2025.06.08, Pub. Date: 8 Dec. 2025

Machine learning model training, which ultimately optimizes a model's cost function, is usually a time-consuming and computationally intensive process on classical computers. This has intensified with the increased demand for large-scale data analysis, motivating unconventional computing paradigms such as quantum computing to improve training efficiency. Adiabatic quantum computers excel at solving optimization problems, which must be expressed in the quadratic unconstrained binary optimization (QUBO) format. In this study, squared-error minimization in the multiple linear regression model is reformulated as a QUBO problem, enabling it to be solved on D-Wave adiabatic quantum computers. The same formulation was used to obtain solutions with gate-based algorithms, namely the quantum approximate optimization algorithm (QAOA) and sampling variational quantum eigensolver (VQE), implemented via IBM Qiskit. The runtimes and mean squared errors (MSE) obtained with these approaches were analyzed and compared with classical approaches. Our experimental results indicate a runtime advantage of the D-Wave annealing approach over classical scikit-learn regression: the advantage appears when N > 524288 compared with scikit-learn LinearRegression and when N > 65536 compared with scikit-learn SGDRegressor. The study also considers support vector machine induced neural networks, in which the margin-based entropy loss is converted into a QUBO via a Lagrangian approach, with respect to their applicability to nonlinear models.
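
The QUBO reformulation can be made concrete with a small worked example: each regression weight is binary-encoded with a fixed-point precision vector, so the squared error becomes a quadratic form over binary variables that an annealer (or brute force, at this toy size) can minimize. The encoding and data below are illustrative choices, not the paper's exact construction.

```python
# Encode least-squares linear regression as a QUBO over binary variables.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 2))
y = X @ np.array([1.5, -0.75]) + 0.01 * rng.normal(size=50)

# Each weight w_j = -2*b0 + 1*b1 + 0.5*b2 + 0.25*b3  (grid from -2 to 1.75).
prec = np.array([-2.0, 1.0, 0.5, 0.25])
d, m = X.shape[1], len(prec)
P = np.kron(np.eye(d), prec)                      # w = P @ b, with b in {0,1}^(d*m)

A = X @ P                                         # design matrix in binary variables
Q = A.T @ A
Q[np.diag_indices_from(Q)] += -2.0 * (A.T @ y)    # linear term folded in (b_i**2 = b_i)

# Brute-force the 8-variable QUBO just to check the encoding.
best_b = min((np.array(list(np.binary_repr(i, d * m)), dtype=float)
              for i in range(2 ** (d * m))),
             key=lambda b: b @ Q @ b)
print("recovered weights:", P @ best_b)           # close to [1.5, -0.75]
```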

Performance Analysis of Deep Learning Techniques for Multi-Focus Image Fusion

By Ravpreet Kaur, Sarbjeet Singh

DOI: https://doi.org/10.5815/ijisa.2025.06.05, Pub. Date: 8 Dec. 2025

Multi-Focus Image Fusion (MFIF) plays an important role in computer vision. It aims to merge multiple images captured at different focus depths into a single, fully focused image. Although deep learning based methods have advanced the MFIF field, they vary significantly in fusion quality and robustness to different focus changes. This paper presents a performance analysis of three deep learning-based MFIF methods: ECNN (Ensemble based Convolutional Neural Network), DRPL (Deep Regression Pair Learning), and SESF-Fuse. These techniques were selected for the public availability of their training and testing source code, which facilitates a thorough and reproducible analysis, and for their diverse architectural approaches to MFIF. Three datasets were used for training: ILSVRC2012, COCO2017, and DIV2K. Performance was evaluated on two publicly available MFIF datasets, Lytro and RealMFF, using four objective evaluation metrics: Mutual Information, the Gradient-based metric, the Piella metric, and the Chen-Varshney metric. Extensive qualitative and quantitative experiments analyze the effectiveness of each technique in preserving details, reducing artifacts, maintaining consistency at boundary regions, and retaining texture fidelity, which jointly determine the feasibility of these methods for real-world applications. The findings illuminate the strengths and limitations of these deep learning approaches, providing valuable insights for future research and development of MFIF methodologies.
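
One of the listed metrics, Mutual Information, can be estimated from a joint grey-level histogram as sketched below; fusion papers typically report MI(A, F) + MI(B, F) over both source images, and the arrays here are random stand-ins for real image data.

```python
# Histogram-based mutual information between a source image and a fused image.
import numpy as np

def mutual_information(img1, img2, bins=64):
    joint, _, _ = np.histogram2d(img1.ravel(), img2.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
src_a = rng.integers(0, 256, size=(128, 128))
fused = np.clip(src_a + rng.integers(-5, 6, size=(128, 128)), 0, 255)
print(mutual_information(src_a, fused))
```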

Convolutional Neural Network Approach for Identity Verification in Computer-Based Testing Exams in Nigeria

By Ogochukwu C. Okeke, Anthony T. Umerah, Ike J. Mgbeafulike, Osita M. Nwakeze

DOI: https://doi.org/10.5815/ijmsc.2025.04.05, Pub. Date: 8 Dec. 2025

Computer-Based Testing (CBT) has gained prominence in Nigeria due to its efficiency and scalability in evaluating students across various educational institutions. However, various forms of exam cheating, such as candidate swapping and unauthorised assistance, threaten its integrity. This research explores the application of Convolutional Neural Networks (CNNs) for identity verification in Nigerian CBT environments and presents a CNN-driven facial biometric model based on the findings. The model extracts facial features of examinees from real-time video of CBT exam sessions and compares them with pre-registered data to verify test takers' identities, as well as to detect and report instances of candidate swapping and unauthorised assistance during the ongoing exam. The model is trained on diverse datasets such as VGGFace2 and the CASIA African Face Dataset to enhance fairness and accuracy for African demographics, ensuring effectiveness in Nigerian CBT and local contexts. The model was evaluated and compared with existing systems and other biometric methods. The assessment involved 2,000 genuine and 3,000 impostor samples and achieved 99.52% accuracy, with precision and recall of 0.998 and 0.99, respectively. The results demonstrate the model's high accuracy, low false acceptance rate, and minimal false rejection rate, and highlight its viability in maintaining exam integrity and accessibility.
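
A generic sketch of the verification decision the abstract describes: an embedding extracted from a live exam frame is compared with the candidate's enrolled embedding, and the session is flagged when similarity drops below a threshold. The 512-dimensional embeddings and the 0.6 threshold are assumptions for illustration, not the paper's configuration.

```python
# Cosine-similarity check between a live face embedding and the enrolled one.
import numpy as np

def cosine_similarity(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(live_embedding, enrolled_embedding, threshold=0.6):
    """Return (is_same_person, similarity)."""
    sim = cosine_similarity(live_embedding, enrolled_embedding)
    return sim >= threshold, sim

rng = np.random.default_rng(0)
enrolled = rng.normal(size=512)                   # from the registration photo
live = enrolled + 0.3 * rng.normal(size=512)      # frame of the same candidate
impostor = rng.normal(size=512)                   # candidate-swap attempt

print(verify(live, enrolled))       # expected: (True, high similarity)
print(verify(impostor, enrolled))   # expected: (False, near-zero similarity)
```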

A Structured Project-Based Learning Pedagogy to Bridge the Theory–Practical Gap in Blockchain Education

By Rachana Yogesh Patil, Rahul Kulkarni

DOI: https://doi.org/10.5815/ijmecs.2025.06.06, Pub. Date: 8 Dec. 2025

Traditional educational institutions prioritize theoretical education over hands-on practical skills, producing a gap between classroom learning and industry requirements, particularly in the fast-growing blockchain sector. A structured case study demonstrates how Project-Based Learning (PBL) was implemented in an undergraduate engineering course focused on blockchain technology. The educational approach evolved through four stages that combined theoretical instruction, collaborative solution design, DApp programming, and assessment. Student performance metrics, including testing coverage, GitHub contributions, documentation quality, and research paper output, are carefully analyzed with algorithmic guidance at each phase. The paper traces the development of teaching methods from traditional practice and outcome-based instruction to project-based learning, supported by visual timeline comparisons. Student feedback shows that the approach enhanced technical abilities, teamwork, and student confidence. The case study demonstrates how PBL bridges academic learning and practical blockchain development needs: over 75% of teams completed functional DApps, and several groups produced research suitable for publication.

Cybersecurity in Philippine Aviation: A Multi-Method Evaluation of Vulnerabilities and Mitigation Strategies Through Document Analysis, Case Study, and Risk Modeling

By Arthur Dela Pena

DOI: https://doi.org/10.5815/ijwmt.2025.06.04, Pub. Date: 8 Dec. 2025

The digitalization of aviation has heightened exposure to cyber risk, yet Philippine aviation governance and practice remain fragmented. This study evaluates sectoral vulnerabilities and feasible mitigations using a multi-method design: (i) document analysis of CAAP circulars, DICT’s National Cybersecurity Plan 2022, and international guidance (ICAO, IATA, NIST, ISO/IEC 27001); (ii) case studies (Cathay Pacific breach; London Heathrow USB mishandling) chosen for analytic transferability to Philippine operations; and (iii) risk modeling via a likelihood–impact matrix with a transparent 1–5 rubric adapted from ICAO SMM, NIST SP 800-30, and DICT, scored independently by two researchers with consensus reconciliation. I integrate results through a SWOT–TOWS synthesis and propose an AI/ML feasibility roadmap tailored to on-prem/air-gapped constraints. Findings reveal high-priority risks, including unauthorized ATC access, reservation-system data breaches, and airport-network ransomware (risk score = 20), driven by monitoring gaps, legacy systems, and uneven policy enforcement. Moderately ranked threats (weak framework implementation; phishing) and under-analyzed insider risk reflect systemic and human-factor weaknesses, compounded by underreporting and limited inter-agency coordination. The study’s novel contribution is a localization map that operationalizes global frameworks for Philippine conditions: phased NIST CSF adoption, tiered ISO/IEC 27001 pathways, and ICAO-aligned CAAP–DICT coordination with centralized incident reporting, plus a staged, low-cost AI/ML roadmap with KPI tracking (MTTD/MTTR, precision/recall). Limitations include the absence of primary stakeholder data and local incident/cost series; I outline a quantitative extension using operator surveys and Expected Annual Loss modeling to strengthen future empirical grounding. The results inform regulators, airlines, and airports on risk-based prioritization and practical governance upgrades to enhance national aviation cyber resilience.
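
The likelihood-impact scoring behind the reported risk score of 20 can be sketched as below. The band cut-offs and the individual ratings are a common convention used here for illustration rather than the paper's exact rubric, though 4 x 5 = 20 matches the high-priority threats listed above.

```python
# Score threats on a 1-5 likelihood x 1-5 impact matrix and assign priority bands.
def risk_score(likelihood: int, impact: int) -> tuple[int, str]:
    score = likelihood * impact
    if score >= 15:
        band = "high"
    elif score >= 8:
        band = "moderate"
    else:
        band = "low"
    return score, band

threats = {                                   # illustrative ratings
    "Unauthorized ATC access":          (4, 5),
    "Reservation-system data breach":   (4, 5),
    "Airport-network ransomware":       (4, 5),
    "Phishing against airline staff":   (3, 3),
}
for name, (likelihood, impact) in threats.items():
    print(name, risk_score(likelihood, impact))
```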
